Preliminaries

Norm of a Matrix $A(n \times m)$

$\|A\| = \max_{\|\underline{x}\|=1}\, \|A\underline{x}\| = \sqrt{\lambda_\max(A^TA)}$

Properties

  • $\|A\underline{x}\| \leq \|A\|\,\|\underline{x}\|$
  • $\|AB\| \leq \|A\|\,\|B\|$
  • $\|A + B\| \leq \|A\| + \|B\|$
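A quick numerical sanity check of the definition and the properties above, using NumPy (the matrices and vector here are arbitrary illustrative values, not from the notes):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 2))   # A is n x m; it need not be square
B = rng.standard_normal((2, 2))
C = rng.standard_normal((3, 2))
x = rng.standard_normal(2)

# ||A|| = sqrt(lambda_max(A^T A)) coincides with the induced 2-norm
norm_A = np.sqrt(np.max(np.linalg.eigvalsh(A.T @ A)))
assert np.isclose(norm_A, np.linalg.norm(A, 2))

# the three properties, checked on random instances (not a proof)
assert np.linalg.norm(A @ x) <= norm_A * np.linalg.norm(x) + 1e-12
assert np.linalg.norm(A @ B, 2) <= norm_A * np.linalg.norm(B, 2) + 1e-12
assert np.linalg.norm(A + C, 2) <= norm_A + np.linalg.norm(C, 2) + 1e-12
```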

Additional Facts About Matrices

  • $A(n \times n)$ is non-singular if and only if $\det(A) \neq 0$
  • Let $A$, $B$, $C$ be square $(n \times n)$ with $C = AB$. Then $\det(C) = \det(A)\,\det(B)$, so $\det(C) \neq 0$ if and only if $\det(A) \neq 0$ and $\det(B) \neq 0$

Convergence for a Series of Functions

Consider the following series:

$\displaystyle \sum_{j=0}^{\infty} f_j(t)$ where each $f_j(t)$ is defined for $t \in [t_0, t_1]$

The infinite series of functions is convergent in $[t_0, t_1]$ if:

  • $\|f_j(t)\| \leq \alpha_j$ for all $t \in [t_0, t_1]$ and $\forall j$
  • The series of bounds converges: $\sum_{j=0}^{\infty} \alpha_j < \infty$
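A tiny numerical illustration of the bound condition, taking $f_j(t) = \frac{t^j}{j!}$ on $[-T, T]$ (the same terms that appear later in these notes). The bounds $\alpha_j = T^j/j!$ have a finite sum, namely $e^T$:

```python
import math

T = 2.0
# alpha_j = T^j / j! dominates |t^j / j!| for every t in [-T, T]
alpha = [T**j / math.factorial(j) for j in range(50)]
print(sum(alpha), math.exp(T))   # the alpha_j sum to (approximately) e^T < infinity
```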

Solution of Autonomous Systems of Differential Equations (State Equation for LTI Systems)

We focus on the state equation representing the dynamics of autonomous (LTI) systems:

$\dot{\underline{x}} = A \underline{x} + B \underline{u}, \quad A(n\times n), \, B(n \times m)$

Solution of $\dot{\underline{x}} = A \underline{x}, \, (\underline{u} = 0)$

Initial Value (IV) Problem (Cauchy Problem)

$\begin{cases} \dot{\underline{x}} = A \underline{x} \\ \underline{x}(0) = \underline{x}_0 \ \text{(IC)} \end{cases}$

Formally, we can integrate the equation directly:

$\displaystyle \underline{x}(t) = \underline{x}_0 + \int_0^t A\underline{x}(\tau)\, d\tau = \underline{x}_0 + A \int_0^t \underline{x}(\tau)\, d\tau$

$\underline{x}(t) = \underline{x}_0 + A \int_0^t \underline{x}(\tau)\, d\tau$ is an integral equation: the unknown $\underline{x}$ appears under the integral sign.

Formal (theoretical) approach: Successive Approximations (Picard iteration)

Start from $\underline{x}_0(t) \equiv \underline{x}_0$ and compute the first approximation $\underline{x}_1(t)$:

$\displaystyle \underline{x}_1(t) = \underline{x}_0 + A \int_0^t \underline{x}_0(\tau)\, d\tau = \underline{x}_0 + tA\,\underline{x}_0$

then move forward

$\displaystyle \underline{x}_2(t) = \underline{x}_0 + A \int_0^t \underline{x}_1(\tau)\, d\tau = \underline{x}_0 + t A \underline{x}_0 + \frac{t^2}{2!} A^2 \underline{x}_0$

$\displaystyle \underline{x}_3(t) = \underline{x}_0 + A \int_0^t \underline{x}_2(\tau)\, d\tau = \underline{x}_0 + t A \underline{x}_0 + \frac{t^2}{2!} A^2 \underline{x}_0 + \frac{t^3}{3!} A^3 \underline{x}_0$

Eventually we get the following:

$\displaystyle \underline{x}_k(t) = \sum_{j=0}^k \frac{t^j}{j!}A^j\underline{x}_0, \quad \text{where } A^0 = I \text{ (identity)}$
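As a sketch of how these partial sums behave in practice, the snippet below builds $\underline{x}_k(t)$ term by term and compares it with `scipy.linalg.expm` as a reference; the matrix $A$ and the initial condition are illustrative choices, not from the notes:

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[0.0, 1.0], [-2.0, -3.0]])   # illustrative 2x2 system matrix
x0 = np.array([1.0, 0.0])                  # illustrative initial condition
t = 0.5

term = x0.copy()                 # j = 0 term: A^0 x_0 = x_0
xk = term.copy()
for j in range(1, 20):
    # turns (t^{j-1}/(j-1)!) A^{j-1} x_0 into (t^j/j!) A^j x_0
    term = (t / j) * (A @ term)
    xk = xk + term

print(xk)                        # partial sum x_k(t)
print(expm(A * t) @ x0)          # reference: e^{At} x_0
```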

Question: Does the Series Converge? $(k \rightarrow \infty)$

Let's apply the convergence theorem above $\rightarrow$ look at the $\underline{\text{bound}}$ for the terms of the series.

$\displaystyle\left\|\frac{t^j}{j!} A^j \underline{x}_0 \right\| \leq \frac{|t|^j}{j!} \|A\|^j \|\underline{x}_0\| \leq \frac{\big(T\,\|A\|\big)^j}{j!} \|\underline{x}_0\| \text{ if } t\in[-T,T]$, using $\|A^j\| \leq \|A\|^j$ and $|t|^j \leq T^j$.

Call $\alpha_j = \frac{\big(T\|A\|\big)^j}{j!}\|\underline{x}_0\|$ and look at $\sum_{j=0}^{\infty} \alpha_j$:

$\displaystyle \sum_{j=0}^{\infty} \alpha_j = \left(\sum_{j=0}^{\infty} \frac{\big(T\|A\|\big)^j}{j!}\right) \|\underline{x}_0\| = e^{T\|A\|}\|\underline{x}_0\|$

Remember: the Taylor series of the exponential is $e^x = \sum_{j=0}^{\infty} \frac{x^j}{j!}$. Thus the series $\sum_{j=0}^{\infty} \frac{t^j}{j!} A^j \underline{x}_0$ is convergent on $[-T, T]$, and since $T$ is arbitrary, for every $t$.

Question: Is $\sum_{j=0}^{\infty} \frac{t^j}{j!} A^j \underline{x}_0$ the solution of $\begin{cases} \dot{\underline{x}} = A \underline{x} \\ \underline{x}(0) = \underline{x}_0 \end{cases}$ ?

Definition: For a matrix $M$, the exponential matrix $e^M = \exp(M)$ is defined via the convergent infinite series:

$$e^M = \sum_{j=0}^{\infty} \frac{1}{j!}M^j$$

General Fact: For a given (analytic) function $f(x)$ and a matrix $A(n \times n)$, $f(A)$ is defined via the infinite series:

$\displaystyle f(x) = \sum_{j=0}^\infty \frac{f^{(j)}(0)}{j!}x^j \Rightarrow f(A) = \sum_{j=0}^\infty \frac{f^{(j)}(0)}{j!} A^j$

Thus:

$\displaystyle \begin{cases} \dot{\underline{x}} = A \underline{x} \\ \underline{x}(0) = \underline{x}_0 \end{cases} \Rightarrow \underline{x}(t) = e^{At}\underline{x}_0, \text{ where } e^{At} = \sum_{j=0}^\infty \frac{t^j}{j!} A^j$
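A minimal check that $e^{At}\underline{x}_0$ agrees with a direct numerical integration of $\dot{\underline{x}} = A\underline{x}$ (same illustrative $A$ and $\underline{x}_0$ as above):

```python
import numpy as np
from scipy.linalg import expm
from scipy.integrate import solve_ivp

A = np.array([[0.0, 1.0], [-2.0, -3.0]])
x0 = np.array([1.0, 0.0])

# integrate the IV problem numerically up to t = 1
sol = solve_ivp(lambda t, x: A @ x, (0.0, 1.0), x0, rtol=1e-10, atol=1e-12)
print(sol.y[:, -1])      # numerically integrated x(1)
print(expm(A) @ x0)      # closed form e^{At} x_0 at t = 1
```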

Properties of $e^{At}$:

  • $e^{A \cdot 0} = I$
  • $\frac{d}{dt} e^{At} = A e^{At}$
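The second property can be checked numerically with a central finite difference (again with an illustrative $A$):

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[0.0, 1.0], [-2.0, -3.0]])
t, h = 0.7, 1e-6

# central difference approximation of d/dt e^{At}
fd = (expm(A * (t + h)) - expm(A * (t - h))) / (2 * h)
print(np.max(np.abs(fd - A @ expm(A * t))))   # should be tiny (~1e-9 or smaller)
```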

Solution of $\begin{cases}\dot{\underline{x}} = A\underline{x} + B\underline{u} \\ \underline{x}(0) = \underline{x}_0\end{cases}$ :

Claim: $\displaystyle \underline{x}(t) = e^{At}\underline{x}_0 + \int_0^t e^{A(t - \tau)}B \underline{u}(\tau)\, d\tau$

Let's see how we prove this is the solution:

  1. It satisfies the IC:

    $\displaystyle\underline{x}(0) = e^{A \cdot 0}\underline{x}_0 + \int_0^0 e^{A(0-\tau)} B\underline{u}(\tau)\, d\tau = \underline{x}_0$

  2. Take the derivative:

    $\displaystyle\dot{\underline{x}} = \frac{d}{dt} \Big(e^{At}\underline{x}_0 + \int_0^t e^{A(t-\tau)}B\underline{u}(\tau)\, d\tau\Big) = Ae^{At}\underline{x}_0 + \frac{d}{dt} \int_0^t e^{A(t-\tau)}B\underline{u}(\tau)\, d\tau$

    Leibniz's formula (differentiation under the integral sign): $\displaystyle\frac{d}{dt} \int_{\alpha(t)}^{\beta(t)} f(t, \tau)\, d\tau = f(t, \beta(t))\, \beta'(t) - f(t, \alpha(t))\, \alpha'(t) + \int_{\alpha(t)}^{\beta(t)} \frac{\partial f}{\partial t}\, d\tau$

    Apply:

    $\displaystyle\dot{\underline{x}} = A e^{At}\underline{x}_0 + e^{A(t-\tau)}B\underline{u}(\tau) \big|_{\tau = t} + \int_0^t A e^{A(t - \tau)} B \underline{u}(\tau)\, d\tau = A \big(\underbrace{e^{At}\underline{x}_0 + \int_0^t e^{A(t-\tau)} B\underline{u}(\tau)\, d\tau}_{\underline{x}(t)}\big) + B \underline{u}(t)$

    $\Rightarrow \dot{\underline{x}} = A\underline{x} + B\underline{u} \Rightarrow$ it satisfies the state equation.
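As a numerical cross-check of the claimed formula, the sketch below evaluates the convolution integral by quadrature and compares it with direct integration of the state equation; $A$, $B$, $\underline{x}_0$, and the unit-step input are illustrative assumptions:

```python
import numpy as np
from scipy.linalg import expm
from scipy.integrate import solve_ivp, trapezoid

A = np.array([[0.0, 1.0], [-2.0, -3.0]])
B = np.array([[0.0], [1.0]])
x0 = np.array([1.0, 0.0])
u = lambda t: np.array([1.0])     # unit-step input (illustrative)
t_end = 1.0

# x(t) = e^{At} x_0 + int_0^t e^{A(t-tau)} B u(tau) dtau, by trapezoidal quadrature
taus = np.linspace(0.0, t_end, 2001)
vals = np.stack([expm(A * (t_end - tau)) @ B @ u(tau) for tau in taus])
x_formula = expm(A * t_end) @ x0 + trapezoid(vals, taus, axis=0)

# reference: integrate the state equation directly
sol = solve_ivp(lambda t, x: A @ x + B @ u(t), (0.0, t_end), x0,
                rtol=1e-10, atol=1e-12)
print(x_formula)
print(sol.y[:, -1])   # the two results should agree to quadrature accuracy
```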

Example #1

Find $e^{At}$ for $A = \begin{bmatrix}0 & 1 \\ 0 & 0 \end{bmatrix}$

$\displaystyle e^{At} = \sum_{j=0}^\infty \frac{t^j}{j!}A^j \Rightarrow A^2 = A \cdot A = \begin{bmatrix}0 & 1 \\ 0 & 0\end{bmatrix}\begin{bmatrix}0 & 1 \\ 0 & 0\end{bmatrix} = \begin{bmatrix}0 & 0 \\ 0 & 0\end{bmatrix}$

Thus $A^k = 0$ for $k \geq 2$.

$e^{At} = I + tA = \begin{bmatrix}1 & t \\ 0 & 1\end{bmatrix}$
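A one-line verification with SciPy (the value of $t$ is arbitrary):

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[0.0, 1.0], [0.0, 0.0]])   # nilpotent: A^2 = 0
t = 3.0
print(expm(A * t))           # expected: [[1, t], [0, 1]]
print(np.eye(2) + t * A)     # I + tA
```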

Example #2

Find $e^{At}$ for $A = \begin{bmatrix}0 & 1 \\ -1 & 0 \end{bmatrix}$

$A^2 = \begin{bmatrix}0 & 1 \\ -1 & 0\end{bmatrix} \begin{bmatrix}0 & 1 \\ -1 & 0\end{bmatrix} = \begin{bmatrix}-1 & 0 \\ 0 & -1\end{bmatrix} = -I$

$A^3 = A^2 A = -A \Rightarrow A^4 = A^3A = (-A)(A) = -A^2 = I \Rightarrow A^5 = A^4A = A$

So we see a pattern here:

$\displaystyle A^k = \begin{cases} (-1)^{\frac{k-1}{2}}A, & k = 1, 3, 5, \dots \ \text{(odd)} \\ (-1)^{\frac{k}{2}}I, & k = 0, 2, 4, \dots \ \text{(even)} \end{cases}$

Then $e^{At}$ can be split into even and odd powers:

$\displaystyle e^{At} = \sum_{j=0}^\infty \frac{t^j}{j!}A^j = \big( \underbrace{1 - \frac{t^2}{2!} + \frac{t^4}{4!} - \dots}_{\cos t} \big) I + \big( \underbrace{t - \frac{t^3}{3!} + \frac{t^5}{5!} - \dots}_{\sin t} \big) A$

$e^{At} = \cos(t)I + \sin(t)A = \begin{bmatrix}\cos(t) & 0 \\ 0 & \cos(t)\end{bmatrix} + \begin{bmatrix}0 & \sin(t) \\ -\sin(t) & 0\end{bmatrix} = \begin{bmatrix}\cos(t) & \sin(t) \\ -\sin(t) & \cos(t)\end{bmatrix}$
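And the corresponding check for this example (arbitrary $t$):

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[0.0, 1.0], [-1.0, 0.0]])
t = 0.8
print(expm(A * t))                            # should be the rotation matrix
print(np.cos(t) * np.eye(2) + np.sin(t) * A)  # cos(t) I + sin(t) A
```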

